Spatiotemporal Gait Measurement With a Side-View Depth Sensor Using Human Joint Proposals


Abstract

We propose a method for calculating standard spatiotemporal gait parameters from individual human joints using a side-view depth sensor. Clinical walking trials were measured concurrently by a Kinect and a pressure-sensitive walkway, the Zeno Walkway. Multiple joint proposals were generated from the depth images by a stochastic prediction algorithm. The proposals are represented as vertices in a weighted graph, where the edge weights depend on expected lengths between body parts. A shortest path through the graph is a set of joints from head to foot. Accurate foot positions are selected by comparing pairs of shortest paths, and stance phases of the feet are detected by examining foot motion over time. The stance information is used to calculate four gait parameters: stride length, step length, step width, and stance percentage. A constant frame rate was assumed when calculating stance percentage because time stamps were not captured during the experiment. Gait parameters from 52 walking trials were compared to ground truth from the walkway using Bland-Altman analysis and intraclass correlation coefficients. The large spatial parameters had the strongest agreements with the ground truth (ICC(2, 1) = 1.00 and 0.98 for stride length and step length at a normal pace, respectively). The presented system directly calculates gait parameters from joint positions, whereas previous side-view systems relied on indirect measures. Using joints allows walks in both directions to be tracked with one camera, extending the range in which the subject stays in the field of view.
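The graph construction described in the abstract can be pictured as a layered shortest-path search: each body part contributes a layer of candidate joint proposals, and edge weights penalize deviation from the expected length between adjacent parts. The part names, expected lengths, and function names below are illustrative assumptions, not taken from the paper; this is a minimal sketch of the idea, not the authors' implementation:

```python
import math

# Illustrative body-part chain and expected inter-part lengths (meters);
# the actual parts and lengths used by the paper are not specified here.
PARTS = ["head", "shoulder", "hip", "knee", "foot"]
EXPECTED = {("head", "shoulder"): 0.25, ("shoulder", "hip"): 0.45,
            ("hip", "knee"): 0.40, ("knee", "foot"): 0.45}

def edge_weight(a, b, expected):
    """Penalty for how far the proposal pair's distance deviates from the expected length."""
    return abs(math.dist(a, b) - expected)

def shortest_path_joints(proposals):
    """Pick one proposal per part, head to foot, minimizing total length deviation.

    proposals: dict mapping part name -> list of (x, y) joint proposals.
    Returns (total cost, [chosen head, ..., chosen foot]).
    """
    # Dynamic programming over the layers of the graph: for each proposal of
    # the current part, keep the cheapest path reaching it from the head.
    best = [(0.0, [p]) for p in proposals[PARTS[0]]]
    for prev, cur in zip(PARTS, PARTS[1:]):
        expected = EXPECTED[(prev, cur)]
        best = [
            min((cost + edge_weight(path[-1], q, expected), path + [q])
                for cost, path in best)
            for q in proposals[cur]
        ]
    return min(best)

def stance_percentage(stance_frames, total_frames):
    """Stance phase as a percentage of the gait cycle; with a constant frame
    rate assumed (no time stamps), frame counts stand in for durations."""
    return 100.0 * stance_frames / total_frames

# Usage: two shoulder and two foot proposals; the path consistent with the
# expected limb lengths is selected.
proposals = {
    "head": [(0.0, 1.70)],
    "shoulder": [(0.0, 1.45), (0.5, 1.45)],
    "hip": [(0.0, 1.00)],
    "knee": [(0.0, 0.60)],
    "foot": [(0.0, 0.15), (0.0, 0.90)],
}
cost, joints = shortest_path_joints(proposals)
```

Because each layer only connects to the next one, the dynamic program runs in time linear in the number of parts and quadratic in the proposals per part, which is far cheaper than a search over all joint combinations.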


Similar articles

Cross View Gait Recognition Using Joint-Direct Linear Discriminant Analysis

This paper proposes a view-invariant gait recognition framework built on a single view-invariant model that profits from the dimensionality reduction provided by Direct Linear Discriminant Analysis (DLDA). The framework, which operates on gait energy images (GEIs), creates a single joint model that accurately classifies GEIs captured at different angles. Moreover, the proposed framework also he...


High Accuracy Depth Measurement using Multi-view Stereo

A novel scheme for depth extraction is achieved using a multiple-view ring camera system. The ring camera method captures a series of images of a scene from a set of camera locations arranged in a circular ring. Tracking scene features through this sequence yields circular feature trajectories. Depth can then be recovered by determining the diameter of the circular...


Analyzing gait with spatiotemporal surfaces

Human motions generate characteristic spatiotemporal patterns. We have developed a set of techniques for analyzing the patterns generated by people walking across the field of view. After change detection, the XYT pattern can be fit with a smooth spatiotemporal surface. This surface is approximately periodic, reflecting the periodicity of the gait. The surface can be expressed as a combination of a ...


Kinect as a Tool for Gait Analysis: Validation of a Real-Time Joint Extraction Algorithm Working in Side View

The Microsoft Kinect sensor has gained attention as a tool for gait analysis for several years. Despite the many advantages the sensor provides, however, the lack of a native capability to extract joints from the side view of a human body still limits the adoption of the device to a number of relevant applications. This paper presents an algorithm to locate and estimate the trajectories of up t...


A Multifunctional Joint Angle Sensor with Measurement Adaptability

The paper presents a multifunctional joint sensor with measurement adaptability for biological engineering applications such as gait analysis and gesture recognition. The adaptability covers both static and dynamic measurements, of body pose as well as motion capture. Its multifunctional capabilities lie in its ability to simultaneously measure multiple degrees of f...



Journal

Journal title: IEEE Journal of Biomedical and Health Informatics

Year: 2021

ISSN: 2168-2208, 2168-2194

DOI: https://doi.org/10.1109/jbhi.2020.3024925